Shannon Entropy and Mutual Information for Multivariate Skew-Elliptical Distributions
Authors
Abstract
Entropy and the mutual information index are important concepts developed by Shannon in the context of information theory. They have been widely studied for the multivariate normal distribution. We first extend these tools to the full symmetric class of multivariate elliptical distributions and then to the more flexible families of multivariate skew-elliptical distributions. We study in detail the cases of the multivariate skew-normal and skew-t distributions. We apply our findings to the optimal design of an ozone monitoring station network in Santiago de Chile.
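The skew-elliptical results generalize the classical Gaussian closed forms for these quantities. As a baseline only (the function names below are illustrative, not from the paper), a minimal NumPy sketch of the normal-case formulas: the differential entropy $H(X) = \tfrac{1}{2}\log\big((2\pi e)^d \det\Sigma\big)$ and the block mutual information $I(X_A; X_B) = H(X_A) + H(X_B) - H(X)$:

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy of N(mu, cov) in nats:
    H = 0.5 * (d * log(2*pi*e) + log det cov)."""
    d = cov.shape[0]
    return 0.5 * (d * np.log(2 * np.pi * np.e) + np.linalg.slogdet(cov)[1])

def gaussian_mutual_information(cov, idx):
    """I(X_A; X_B) = H(X_A) + H(X_B) - H(X) for a jointly Gaussian vector,
    where idx lists the coordinates forming block A."""
    idx = np.asarray(idx)
    rest = np.setdiff1d(np.arange(cov.shape[0]), idx)
    return (gaussian_entropy(cov[np.ix_(idx, idx)])
            + gaussian_entropy(cov[np.ix_(rest, rest)])
            - gaussian_entropy(cov))

# Bivariate normal with correlation rho: I(X; Y) = -0.5 * log(1 - rho^2).
rho = 0.8
cov = np.array([[1.0, rho], [rho, 1.0]])
mi = gaussian_mutual_information(cov, [0])
```

In a monitoring-network context, quantities like `mi` measure how much information the selected stations carry about the remaining sites, which is the kind of criterion an optimal-design procedure can maximize.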
Similar Resources
Some Results Based on Entropy Properties of Progressive Type-II Censored Data
In many life-testing and reliability studies, the experimenter might not always obtain complete information on failure times for all experimental units. One of the most common censoring schemes is progressive type-II censoring. The aim of this paper is to characterize parent distributions based on the Shannon entropy of progressive type-II censored order statistics. It is shown that the equality...
Shannon entropy in generalized order statistics from Pareto-type distributions
In this paper, we derive the exact analytical expressions for the Shannon entropy of generalized order statistics from Pareto-type and related distributions.
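That paper treats generalized order statistics; for the underlying Pareto law itself the Shannon entropy has a well-known closed form, $H = \ln(x_m/\alpha) + 1/\alpha + 1$ for shape $\alpha$ and scale $x_m$. A hedged sketch (illustrative names, not the paper's derivation) checking this against the Monte Carlo plug-in estimate $-\frac{1}{n}\sum_i \ln f(X_i)$:

```python
import numpy as np

def pareto_entropy_exact(alpha, xm):
    """Closed-form Shannon entropy of Pareto(alpha, xm), in nats."""
    return np.log(xm / alpha) + 1.0 / alpha + 1.0

def pareto_entropy_mc(alpha, xm, n=200_000, seed=0):
    """Monte Carlo estimate H ~ -mean(log f(X)), sampling X by the
    inverse CDF: X = xm * U ** (-1/alpha) for U ~ Uniform(0, 1)."""
    rng = np.random.default_rng(seed)
    x = xm * rng.random(n) ** (-1.0 / alpha)
    log_f = np.log(alpha) + alpha * np.log(xm) - (alpha + 1.0) * np.log(x)
    return -log_f.mean()
```

The Monte Carlo estimate converges to the closed form at the usual $O(n^{-1/2})$ rate, which makes it a convenient sanity check for analytic entropy expressions.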
Dynamic Bayesian Information Measures
This paper introduces measures of information for Bayesian analysis when the support of the data distribution is truncated progressively. The focus is on lifetime distributions where the support is truncated at the current age t ≥ 0. Notions of uncertainty and information are presented and operationalized by Shannon entropy, Kullback-Leibler information, and mutual information. Dynamic updatings...
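One concrete instance of such age-conditioned measures, sketched here under illustrative names rather than the paper's framework: for an exponential lifetime, memorylessness makes the residual entropy $H(X - t \mid X > t) = 1 - \ln\lambda$ the same for every truncation age $t$. A numeric check by midpoint integration of $-\int f \ln f$:

```python
import numpy as np

def residual_entropy(rate, t, upper=40.0, n=200_000):
    """Shannon entropy of X - t given X > t for X ~ Exponential(rate),
    computed by midpoint-rule integration of -f log f on [0, upper]."""
    dx = upper / n
    x = np.linspace(0.0, upper, n, endpoint=False) + dx / 2.0  # cell midpoints
    # Residual density: f(t + x) / S(t); for the exponential this
    # simplifies to rate * exp(-rate * x), independent of t.
    f_res = rate * np.exp(-rate * (t + x)) / np.exp(-rate * t)
    return -np.sum(f_res * np.log(f_res)) * dx
```

Evaluating at two different ages returns the same value, which is exactly the "no aging" property that makes the exponential the boundary case for dynamic information measures.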
Performance comparison of new nonparametric independent component analysis algorithm for different entropic indexes
Most independent component analysis (ICA) algorithms use mutual information (MI) measures based on Shannon entropy as a cost function, but Shannon entropy is not the only measure in the literature. In this paper, instead of Shannon entropy, Tsallis entropy is used and a novel ICA algorithm, which uses kernel density estimation (KDE) for estimation of source distributions, is proposed. KDE is di...
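A minimal sketch of the kind of plug-in estimator involved (illustrative only, not the paper's algorithm): fit a Gaussian KDE to the samples and evaluate the Tsallis entropy $S_q = \big(1 - E_f[f^{q-1}]\big)/(q-1)$ by replacing the expectation with a sample mean:

```python
import numpy as np
from scipy.stats import gaussian_kde

def tsallis_entropy_kde(samples, q=2.0):
    """Plug-in Tsallis entropy S_q = (1 - E_f[f^(q-1)]) / (q - 1),
    with the unknown density f replaced by a Gaussian KDE and the
    expectation by the mean over the observed samples."""
    kde = gaussian_kde(samples)
    return (1.0 - np.mean(kde(samples) ** (q - 1.0))) / (q - 1.0)

# For a standard normal, the exact value at q = 2 is 1 - 1/(2*sqrt(pi)).
rng = np.random.default_rng(1)
s2 = tsallis_entropy_kde(rng.standard_normal(5000), q=2.0)
```

As $q \to 1$, $S_q$ recovers the Shannon entropy, so an ICA cost built on $S_q$ interpolates between Shannon-based and heavier- or lighter-tail-weighted contrasts.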
Relations Between Conditional Shannon Entropy and Expectation of $\ell_{\alpha}$-Norm
The paper examines relationships between the conditional Shannon entropy and the expectation of the $\ell_{\alpha}$-norm for joint probability distributions. More precisely, we investigate the tight bounds of the expectation of the $\ell_{\alpha}$-norm with a fixed conditional Shannon entropy, and vice versa. As applications of the results, we derive the tight bounds between the conditional Shannon entropy and several informati...